Towards Continuous Sign Language Recognition with Deep Learning

Authors

  • Boris Mocialov
  • Graham Turner
  • Katrin Lohan
  • Helen Hastie
Abstract

Humans communicate with each other using abstract signs and symbols. While cooperation between humans and machines can be a powerful tool for solving complex or difficult tasks, the communication must occur at a level of abstraction that is both natural to humans and understandable to machines. Our paper focuses on natural language and, in particular, on sign language recognition. The approach described here combines heuristics that segment the video stream by identifying epenthesis (the transitional movement between signs) with stacked LSTMs that automatically classify the derived segments. This approach segments a continuous stream of video data with an accuracy of over 80% and reaches accuracies of over 95% on segmented sign recognition. We compare results in terms of the number of signs being recognised and the utility of the various features used for recognition. We aim to integrate the models into a single continuous sign language recognition system and to learn policies for specific domains that map a robot's perception to its actions. This will improve the accuracy of understanding of the common task within the shared activity between a human and a machine. Such understanding, in turn, will foster meaningful cooperation.
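As a rough illustration of the classification stage described in the abstract, the sketch below shows how a stacked LSTM could map a sequence of per-frame features from one heuristically cut segment to a sign class. This is not the authors' code: the feature dimensionality, hidden size, number of layers, number of sign classes and the use of PyTorch are all assumptions made for the example.

import torch
import torch.nn as nn

class StackedLSTMSignClassifier(nn.Module):
    """Stacked-LSTM classifier for a single pre-segmented sign (illustrative sketch only)."""

    def __init__(self, feature_dim=64, hidden_dim=128, num_layers=2, num_signs=20):
        super().__init__()
        # Two stacked LSTM layers read the sequence of per-frame feature vectors.
        self.lstm = nn.LSTM(feature_dim, hidden_dim,
                            num_layers=num_layers, batch_first=True)
        # The final hidden state is mapped to one score per sign class.
        self.classifier = nn.Linear(hidden_dim, num_signs)

    def forward(self, segment):
        # segment: (batch, frames, feature_dim), one heuristically cut video segment
        _, (h_n, _) = self.lstm(segment)
        return self.classifier(h_n[-1])  # (batch, num_signs) class logits

# Example: classify one 30-frame segment of 64-dimensional frame features.
model = StackedLSTMSignClassifier()
logits = model(torch.randn(1, 30, 64))
predicted_sign = logits.argmax(dim=-1)

A complete system would place the epenthesis-based segmentation heuristic in front of such a classifier, feeding it one segment at a time.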

Similar resources

Proposal for a Deep Learning Architecture for Activity Recognition

Activity recognition from computer vision plays an important role in research towards applications like human-computer interfaces, intelligent environments, surveillance or medical systems. In this paper, we propose a gesture recognition system based on a deep learning architecture and show how it performs when trained with changing multimodal input data on an Italian sign language dataset. The...

Towards a Video Corpus for Signer-Independent Continuous Sign Language Recognition

Research in the field of continuous sign language recognition has not yet addressed the problem of interpersonal variance in signing. Applied to signer-independent tasks, current recognition systems show poor performance, as their training is based upon corpora with an insufficient number of signers. In contrast to speech recognition, there is actually no benchmark which meets the requirements for s...

Deep Sign: Hybrid CNN-HMM for Continuous Sign Language Recognition

This paper introduces the end-to-end embedding of a CNN into an HMM, while interpreting the outputs of the CNN in a Bayesian fashion. The hybrid CNN-HMM combines the strong discriminative abilities of CNNs with the sequence-modelling capabilities of HMMs. Most current approaches in the field of gesture and sign language recognition disregard the necessity of dealing with sequence data both for training a...

Evaluation of Deep Learning based Pose Estimation for Sign Language

Human body pose estimation and hand detection, being prerequisites for sign language recognition (SLR), are both crucial and challenging tasks in Computer Vision and Machine Learning. There are many algorithms to accomplish these tasks, for which the performance measures need to be evaluated for body posture recognition on a sign language dataset that would serve as a baseline to provide impo...

Using Deep Convolutional Networks for Gesture Recognition in American Sign Language

In the realm of multimodal communication, sign language is, and continues to be, one of the most understudied areas. In line with recent advances in the field of deep learning, there are far-reaching implications and applications that neural networks can have for sign language interpretation. In this paper, we present a method for using deep convolutional networks to classify images of both the...

Publication date: 2017